Recent #Data center news in the semiconductor industry

5 months ago
➀ Major tech companies have significantly increased capital expenditures in recent years to acquire AI chips and build data centers. ➁ Global investment in data centers has recently slowed, raising concerns about the sustainability of the AI boom. ➂ AI phones may help sustain the chip industry amid slowing data center investment.
AI · Chip Industry · Data center · NVIDIA · Supply chain
6 months ago
➀ Fujitsu unveils Monaka, a 144-core Armv9-based processor designed for future data centers; ➁ The chip is built on TSMC's N2 process and uses a CoWoS system-in-package with SRAM tiles attached by hybrid copper bonding; ➂ Fujitsu is targeting superior energy efficiency with air cooling when the chip ships in 2026-2027.
2nm · 3D-stacking · 5nm · AMD EPYC · Arm · CPU · Chiplets · Data center · Energy efficiency · Fujitsu · Intel Xeon · Monaka
7 months ago
➀ Dell has launched the new PowerEdge XE9712 with NVIDIA GB200 NVL72 AI servers, offering 30x faster real-time LLM inference than the H100 AI GPU; ➁ The system features 72 B200 AI GPUs connected over NVLink into a single high-bandwidth domain (see the sketch below); ➂ Dell highlights the liquid-cooled design for maximizing data center power utilization and rapid deployment of AI clusters.
AI · Data center · Dell · GPU · LLM · NVIDIA · Performance · Training · Inference
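For scale, here is a rough fabric-bandwidth estimate for a 72-GPU NVLink domain. The 1.8 TB/s per-GPU figure is an assumed NVLink 5 value, not something stated in the Dell announcement, so treat this as a back-of-the-envelope sketch only.

```python
# Back-of-the-envelope NVLink fabric arithmetic for a GB200 NVL72 rack.
# Assumption: ~1.8 TB/s of NVLink bandwidth per Blackwell GPU (not from the article).
NUM_GPUS = 72                  # GPUs per NVL72 domain, as reported above
NVLINK_TBPS_PER_GPU = 1.8      # assumed per-GPU NVLink bandwidth, TB/s

aggregate_tbps = NUM_GPUS * NVLINK_TBPS_PER_GPU
print(f"Aggregate NVLink bandwidth: ~{aggregate_tbps:.0f} TB/s")   # ~130 TB/s
```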
10 months ago
➀ Kioxia demonstrated an optical interface SSD at FMS, developed with funding from NEDO to reduce data center power consumption by up to 40%. ➁ The optical interface allows data transfer over long distances, enabling storage to be kept in separate rooms with minimal cooling requirements (see the latency sketch below). ➂ The demonstration showed a slight loss in IOPS but a significant latency advantage over traditional copper network links.
Data center · KIOXIA · SSD
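To see why moving storage to a separate room barely affects latency, here is a minimal propagation-delay sketch. The 40 m cable run and the 80 µs flash read latency are illustrative assumptions, not Kioxia's measured numbers.

```python
# Latency budget for placing an NVMe SSD in a separate room over an optical link.
# All inputs below are illustrative assumptions, not figures from the demo.
distance_m = 40                 # assumed fibre run between compute and storage rooms
speed_in_fiber = 2.0e8          # light travels at roughly 2e8 m/s in glass
flash_read_latency_us = 80.0    # assumed media read latency of a data center SSD

one_way_us = distance_m / speed_in_fiber * 1e6
round_trip_share = 2 * one_way_us / flash_read_latency_us
print(f"One-way propagation delay: {one_way_us:.2f} us")        # ~0.2 us
print(f"Round trip vs. flash latency: {round_trip_share:.1%}")  # well under 1%
```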
10 months ago
➀ Micron Technology introduces the Micron 9550 NVMe SSD series, which it claims are the world's fastest PCIe 5.0 data center SSDs; ➁ The drives offer maximum sequential read speeds of 14 GB/s and write speeds of 10 GB/s, up to 67% higher performance than competitors; ➂ They are designed for AI workloads, with random read speeds of up to 3.3 million IOPS and random write speeds of 400,000 IOPS, suiting them to large language model and graph neural network training (a quick consistency check follows below).
AI · Data center · SSD
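A quick consistency check of those headline numbers, assuming the conventional 4 KiB random I/O size (an assumption, since the transfer size is not given above):

```python
# Sanity check: does 3.3M random-read IOPS line up with the 14 GB/s sequential figure?
# Assumes 4 KiB per random read, the conventional benchmark transfer size.
random_read_iops = 3_300_000
io_size_bytes = 4 * 1024

random_read_gb_per_s = random_read_iops * io_size_bytes / 1e9
print(f"Random-read throughput: ~{random_read_gb_per_s:.1f} GB/s")
# ~13.5 GB/s, close to the 14 GB/s sequential ceiling, so the drive stays
# near its bandwidth limit even under small random reads.
```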
10 months ago
➀ PAM4 SerDes technology, which carries two bits per symbol instead of NRZ's one, significantly improves data throughput and power efficiency for AI and data center applications (see the encoding sketch below); ➁ The technology covers the full range of reach requirements, from long-reach to short-reach channels, while maintaining reliable, high-speed transmission; ➂ Cadence's advanced SerDes solutions and involvement in the Ultra Ethernet Consortium highlight ongoing innovation in Ethernet technology.
AI · Cadence · Data center
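The throughput gain comes from modulation rather than clock rate: PAM4 maps two bits onto each of four amplitude levels, so a lane carries twice the bits of NRZ at the same symbol rate. A minimal encoding sketch follows; the Gray-coded level mapping is the conventional choice and is assumed here rather than taken from the article.

```python
# Minimal PAM4 encoder sketch: two bits per symbol across four amplitude levels.
# The Gray-coded mapping below is the conventional choice, assumed here.
GRAY_MAP = {(0, 0): -3, (0, 1): -1, (1, 1): +1, (1, 0): +3}

def pam4_encode(bits):
    """Map an even-length bit sequence to a list of PAM4 levels."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits in pairs"
    return [GRAY_MAP[(bits[i], bits[i + 1])] for i in range(0, len(bits), 2)]

print(pam4_encode([1, 0, 0, 1, 1, 1, 0, 0]))   # 8 bits -> 4 symbols: [3, -1, 1, -3]
# At a fixed 56 GBaud symbol rate, NRZ carries 56 Gb/s while PAM4 carries 112 Gb/s.
```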